A Feasible Directions Method for Nonsmooth Convex Optimization

Authors

  • Jose Herskovits
  • Wilhelm P. Freire
  • Mario Tanaka
Abstract

We propose a new technique for minimizing convex functions that are not necessarily smooth. Our approach employs an equivalent constrained optimization problem together with approximate linear programs obtained from cutting planes. At each iteration a search direction and a step length are computed. If the step is judged "non-serious", a cutting plane is added and a new search direction is computed. This procedure is repeated until a "serious" step is obtained, at which point the search direction is a feasible descent direction for the constrained equivalent problem. The search directions are computed with FDIPA, the Feasible Directions Interior Point Algorithm. We prove global convergence and solve several test problems very efficiently.
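To make the outer loop concrete, the sketch below runs a plain Kelley-type cutting-plane iteration with a serious/null-step acceptance test on the epigraph reformulation min { z : f(x) <= z }. It only illustrates the serious/non-serious mechanism described above: the trial point comes from a linear program over the accumulated cuts, not from FDIPA, and the example objective, box bounds, tolerance, and acceptance parameter m are assumptions made for the demonstration.

import numpy as np
from scipy.optimize import linprog

def f_and_subgrad(x):
    # Example nonsmooth convex objective f(x) = ||x||_1 + 0.5*||x||_2^2
    # together with one valid subgradient (sign(0) = 0 is admissible).
    return np.abs(x).sum() + 0.5 * x @ x, np.sign(x) + x

def cutting_plane_with_serious_steps(x0, tol=1e-5, m=0.1, box=10.0, max_iter=500):
    # Epigraph variables (x, z); each cut enforces f(x_i) + g_i.(x - x_i) <= z,
    # written as [g_i, -1] @ [x, z] <= g_i.x_i - f(x_i).
    n = x0.size
    x = x0.copy()
    fx, gx = f_and_subgrad(x)
    A, b = [], []

    def add_cut(point, fval, grad):
        A.append(np.append(grad, -1.0))
        b.append(grad @ point - fval)

    add_cut(x, fx, gx)
    c = np.append(np.zeros(n), 1.0)            # minimize z
    bounds = [(-box, box)] * (n + 1)           # artificial box to keep the LP bounded

    for _ in range(max_iter):
        res = linprog(c, A_ub=np.array(A), b_ub=np.array(b),
                      bounds=bounds, method="highs")
        y, z_model = res.x[:n], res.x[n]
        gap = fx - z_model                     # decrease predicted by the model
        if gap <= tol:
            break
        fy, gy = f_and_subgrad(y)
        add_cut(y, fy, gy)                     # a cutting plane is added in either case
        if fy <= fx - m * gap:                 # "serious" step: accept the candidate
            x, fx = y, fy
        # otherwise a "null" (non-serious) step: keep x and re-solve with the new cut
    return x, fx

print(cutting_plane_with_serious_steps(np.array([3.0, -2.0])))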

Similar articles

A Bundle Method for a Class of Bilevel Nonsmooth Convex Minimization Problems

We consider the bilevel problem of minimizing a nonsmooth convex function over the set of minimizers of another nonsmooth convex function. Standard convex constrained optimization is a particular case in this framework, corresponding to taking the lower level function as a penalty of the feasible set. We develop an explicit bundle-type algorithm for solving the bilevel problem, where each itera...
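In symbols (generic notation, not taken from the cited paper), the bilevel problem reads

\min_{x} f(x) \quad \text{subject to} \quad x \in \operatorname*{Arg\,min}_{y} g(y),

and choosing the lower-level function g as an exact penalty of a closed convex set C, so that \operatorname{Arg\,min} g = C, recovers the standard constrained problem \min \{ f(x) : x \in C \}.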

On Sequential Optimality Conditions without Constraint Qualifications for Nonlinear Programming with Nonsmooth Convex Objective Functions

Sequential optimality conditions provide adequate theoretical tools to justify stopping criteria for nonlinear programming solvers. Here, nonsmooth approximate gradient projection and complementary approximate Karush-Kuhn-Tucker conditions are presented. These sequential optimality conditions are satisfied by local minimizers of optimization problems independently of the fulfillment of constrai...

An efficient one-layer recurrent neural network for solving a class of nonsmooth optimization problems

Constrained optimization problems have a wide range of applications in science, economics, and engineering. In this paper, a neural network model is proposed to solve a class of nonsmooth constrained optimization problems with a nonsmooth convex objective function subject to nonlinear inequality and affine equality constraints. It is a one-layer non-penalty recurrent neural network based on the...

Interior Epigraph Directions method for nonsmooth and nonconvex optimization via generalized augmented Lagrangian duality

We propose and study a new method, called the Interior Epigraph Directions (IED) method, for solving constrained nonsmooth and nonconvex optimization problems. The IED method considers the dual problem induced by a generalized augmented Lagrangian duality scheme, and obtains the primal solution by generating a sequence of iterates in the interior of the dual epigraph. First, a deflected subgradient (DSG...

Approximate Level Method for Nonsmooth Convex Minimization

In this paper, we propose and analyse an approximate variant of the level method of Lemaréchal, Nemirovskii and Nesterov for minimizing nonsmooth convex functions. The main per-iteration work of the level method is spent on (i) minimizing a piecewise-linear model of the objective function and (ii) projecting onto the intersection of the feasible region and a level set of the model function. We ...
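In generic notation (an illustration, not the authors' exact formulation), the two per-iteration subproblems are

\hat f_k(x) = \max_{0 \le i \le k} \bigl\{ f(x_i) + \langle g_i, x - x_i \rangle \bigr\},
\qquad
x_{k+1} = \Pi_{\,Q \cap \{x \,:\, \hat f_k(x) \le \ell_k\}}(x_k),

where \hat f_k is the piecewise-linear (cutting-plane) model built from subgradients g_i, Q is the feasible region, \ell_k is the current level, and \Pi denotes Euclidean projection.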


Journal:

Volume   Issue

Pages  -

Publication date: 2009